Attention (machine learning)
In neural networks, attention is a technique that mimics cognitive attention. It enhances some parts of the input data while diminishing others, the motivation being that the network should devote more focus to the small but important parts of the data.
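In the most common form, the weighting is computed from the data itself: a query vector is compared against a set of key vectors (for example by dot products), the resulting scores are normalized with a softmax, and the associated value vectors are averaged with those weights. The sketch below is a minimal, illustrative implementation of this scaled dot-product weighting using NumPy; the function name, array shapes, and example numbers are assumptions chosen for demonstration, not taken from the source.

```python
import numpy as np

def scaled_dot_product_attention(query, keys, values):
    """Weight `values` by how well each corresponding key matches the query.

    query:  shape (d,)    -- a single query vector
    keys:   shape (n, d)  -- n key vectors
    values: shape (n, v)  -- n value vectors
    Returns a (v,) vector: a softmax-weighted average of the values.
    """
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)        # similarity of the query to each key
    weights = np.exp(scores - scores.max())   # softmax, shifted for numerical stability
    weights /= weights.sum()
    return weights @ values                   # values with well-matching keys dominate

# Example: the query is most similar to the second key,
# so the output is pulled toward the second value.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([0.1, 0.9])
print(scaled_dot_product_attention(query, keys, values))
```

Because the weights sum to one and depend on the input, the mechanism amplifies the parts of the data most relevant to the query while down-weighting the rest, which is the "enhance some parts, diminish others" behaviour described above.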